Sparse Prediction with the $k$-Support Norm

Authors

  • Andreas Argyriou
  • Rina Foygel
  • Nathan Srebro
Abstract

We derive a novel norm that corresponds to the tightest convex relaxation of sparsity combined with an $\ell_2$ penalty. We show that this new $k$-support norm provides a tighter relaxation than the elastic net and can thus be advantageous in sparse prediction problems. We also bound the looseness of the elastic net, thus shedding new light on it and providing justification for its use.
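For reference, the $k$-support norm is the norm whose unit ball is the convex hull of the set of $k$-sparse vectors with $\ell_2$ norm at most one. Equivalently, it can be written as a group lasso penalty over all supports of cardinality at most $k$:

$$\|w\|_k^{sp} \;=\; \min\Big\{ \sum_{I \in \mathcal{G}_k} \|v_I\|_2 \;:\; \operatorname{supp}(v_I) \subseteq I,\ \sum_{I \in \mathcal{G}_k} v_I = w \Big\},$$

where $\mathcal{G}_k$ is the collection of all subsets of $\{1, \dots, d\}$ of cardinality at most $k$. For $k = 1$ this recovers the $\ell_1$ norm and for $k = d$ the $\ell_2$ norm, so the norm interpolates between the lasso and ridge penalties.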


Similar articles

Efficient k-Support-Norm Regularized Minimization via Fully Corrective Frank-Wolfe Method

The k-support-norm regularized minimization has recently been applied with success to sparse prediction problems. The proximal gradient method is conventionally used to minimize this composite model; however, it tends to suffer from a high per-iteration cost, so solving the model can be time-consuming. In our work, we reformulate the k-support-norm regularized formulation into a constrained fo...

Full text
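Although the abstract above is truncated, the key computational primitive of any Frank-Wolfe method over the constrained reformulation $\min_w f(w)$ subject to $\|w\|_k^{sp} \le \tau$ is the linear minimization oracle over the norm ball. Since that ball is the convex hull of $k$-sparse vectors of $\ell_2$ norm at most one, the oracle reduces to a top-$k$ selection on the gradient. The sketch below is illustrative only: a plain Frank-Wolfe loop on a least-squares objective, not the fully corrective variant of that paper, and all function names are ours.

import numpy as np

def ksup_lmo(grad, k, tau):
    # Linear minimization oracle over the k-support ball of radius tau.
    # The ball is conv{v : ||v||_0 <= k, ||v||_2 <= 1} scaled by tau, so the
    # minimizer of <grad, s> is k-sparse: keep the k largest gradient
    # magnitudes, flip their signs, and rescale to l2 length tau.
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    s = np.zeros_like(grad)
    s[idx] = -grad[idx]
    nrm = np.linalg.norm(s)
    return tau * s / nrm if nrm > 0 else s

def frank_wolfe_ksup(X, y, k, tau, iters=200):
    # Plain Frank-Wolfe for: min 0.5*||Xw - y||^2  s.t.  ||w||_k^sp <= tau.
    w = np.zeros(X.shape[1])
    for t in range(iters):
        grad = X.T @ (X @ w - y)
        s = ksup_lmo(grad, k, tau)
        gamma = 2.0 / (t + 2)  # standard diminishing step size
        w = (1 - gamma) * w + gamma * s
    return w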

New Perspectives on k-Support and Cluster Norms

We study a regularizer which is defined as a parameterized infimum of quadratics, and which we call the box-norm. We show that the k-support norm, a regularizer proposed by Argyriou et al. (2012) for sparse vector prediction problems, belongs to this family, and the box-norm can be generated as a perturbation of the former. We derive an improved algorithm to compute the proximity operator of th...

Full text
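To make the "parameterized infimum of quadratics" concrete: in that line of work the box-norm takes, up to boundary conventions such as $0/0 = 0$, the form

$$\|w\|_\Theta \;=\; \Big( \min_{\theta \in \Theta} \sum_{i=1}^{d} \frac{w_i^2}{\theta_i} \Big)^{1/2}, \qquad \Theta = \Big\{ \theta \in \mathbb{R}^d : a \le \theta_i \le b,\ \sum_{i=1}^{d} \theta_i \le c \Big\},$$

with the $k$-support norm corresponding to the choice $a = 0$, $b = 1$, $c = k$; taking $a > 0$ gives the perturbation mentioned above.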

Spectral k-Support Norm Regularization

The k-support norm has successfully been applied to sparse vector prediction problems. We observe that it belongs to a wider class of norms, which we call the box-norms. Within this framework we derive an efficient algorithm to compute the proximity operator of the squared norm, improving upon the original method for the k-support norm. We extend the norms from the vector to the matrix setting ...

Full text
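The vector-to-matrix extension here follows the standard construction for orthogonally invariant norms: the spectral $k$-support norm of a matrix $W$ is the vector norm applied to its singular values,

$$\|W\| \;=\; \big\| \sigma(W) \big\|_{k}^{sp},$$

so $k = 1$ recovers the trace (nuclear) norm and $k = \min(m, n)$ the Frobenius norm, and the proximity operator is obtained by applying the vector proximity operator to the spectrum.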

Sparse Prediction with the k-Overlap Norm

We derive a novel norm that corresponds to the tightest convex relaxation of sparsity combined with an $\ell_2$ penalty and can also be interpreted as a group Lasso norm with overlaps. We show that this new norm provides a tighter relaxation than the elastic net and suggest using it as a replacement for the Lasso or the elastic net in sparse prediction problems.

Full text
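The norm itself is cheap to evaluate: it has a closed form in terms of the sorted magnitudes of $w$, derived in the paper. A minimal NumPy sketch of that computation (the function name is ours):

import numpy as np

def k_support_norm(w, k):
    # Closed form over sorted magnitudes: with z = |w| sorted in decreasing
    # order (1-indexed, convention z_0 := +inf), find the unique r in
    # {0, ..., k-1} such that
    #     z_{k-r-1} > (1/(r+1)) * sum_{i >= k-r} z_i >= z_{k-r};
    # then ||w||_k^sp = sqrt( sum_{i < k-r} z_i^2 + (sum_{i >= k-r} z_i)^2 / (r+1) ).
    # For k = 1 this yields the l1 norm; for k = len(w), the l2 norm.
    z = np.sort(np.abs(np.asarray(w, dtype=float)))[::-1]
    d = z.size
    assert 1 <= k <= d
    tails = np.concatenate([np.cumsum(z[::-1])[::-1], [0.0]])  # tails[j] = z[j:].sum()
    for r in range(k):
        t = tails[k - r - 1] / (r + 1)
        upper = np.inf if k == r + 1 else z[k - r - 2]
        if upper > t >= z[k - r - 1]:
            return float(np.sqrt((z[:k - r - 1] ** 2).sum() + (r + 1) * t ** 2))
    raise AssertionError("unreachable: a valid r always exists")

# Sanity checks: k_support_norm([3.0, 1.0], 1) == 4.0 (the l1 norm), and
# k_support_norm([3.0, 1.0], 2) == 10 ** 0.5 (the l2 norm).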

Approximating Combined Discrete Total Variation and Correlated Sparsity With Convex Relaxations

The recently introduced k-support norm has been successfully applied to sparse prediction problems with correlated features. This norm, however, lacks any explicit structural constraints commonly found in machine learning and image processing. We address this problem by incorporating a total variation penalty in the k-support framework. We introduce the (k, s) support total variation norm as the ...

Full text



Publication date: 2012